Bias Reduction for k-Sample Functionals

Authors

  • Christopher S. Withers
  • Saralees Nadarajah
Abstract

We give analytic methods for nonparametric bias reduction that remove the need for computationally intensive methods like the bootstrap and the jackknife. We call an estimate pth order if its bias has magnitude n0^{-p} as n0 → ∞, where n0 is the sample size (or the minimum sample size if the estimate is a function of more than one sample). Most estimates are only first order and require O(N) calculations, where N is the total sample size. The usual bootstrap and jackknife estimates are second order, but they are computationally intensive, requiring O(N^2) calculations for one sample. By contrast, Jaeckel's infinitesimal jackknife is an analytic second order one-sample estimate requiring only O(N) calculations. When pth order bootstrap and jackknife estimates are available, they require O(N^p) calculations, and so become even more computationally intensive if one chooses p > 2. For general p we provide analytic pth order nonparametric estimates that require only O(N) calculations. Our estimates are given in terms of the von Mises derivatives of the functional being estimated, evaluated at the empirical distribution. For products of moments an unbiased estimate exists: our form for this "polykay" is much simpler than the usual form in terms of power sums.
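As a minimal illustration of the first-order bias that plug-in estimates carry, and of the exactly unbiased polykay-type alternative, consider estimating the squared mean mu^2. The sketch below (function names are ours, not the paper's) contrasts the plug-in estimate, whose bias is var(X)/n, with the average of products over distinct index pairs, which is exactly unbiased:

```python
def plugin_sq_mean(x):
    """Plug-in (first order) estimate of mu^2: the sample mean squared.
    Its bias is var(X)/n, which only vanishes as n -> infinity."""
    m = sum(x) / len(x)
    return m * m

def unbiased_sq_mean(x):
    """Exactly unbiased estimate of mu^2, averaging x_i * x_j over
    distinct pairs i != j: (S1^2 - S2) / (n * (n - 1)),
    with S1 = sum x_i and S2 = sum x_i^2.
    This is the simplest "polykay"-type estimate."""
    n = len(x)
    s1 = sum(x)
    s2 = sum(v * v for v in x)
    return (s1 * s1 - s2) / (n * (n - 1))
```

Unbiasedness can be checked exactly by enumerating every size-2 sample drawn with replacement from a tiny population.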


Related articles

Equivalence of K-functionals and modulus of smoothness for the Fourier transform

In the Hilbert space L2(Rn), we prove the equivalence between the modulus of smoothness and the K-functionals constructed from the Sobolev space corresponding to the Fourier transform. For this purpose, we use a spherical mean operator.



Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators

We provide finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences. Rather than plugging a consistent density estimate (which requires k → ∞ as the sample size n → ∞) into the functional of interest, the estimators we consider fix k and perform a bias corr...
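One concrete instance of the fixed-k idea is the classic Kozachenko-Leonenko nearest-neighbor entropy estimator; the hedged 1-D sketch below is ours and is not necessarily the exact estimator analyzed in that paper. For integer arguments the digamma difference psi(n) - psi(k) reduces to a harmonic sum, so no special functions are needed:

```python
import math

def kl_entropy_1d(x, k=1):
    """Kozachenko-Leonenko fixed-k nearest-neighbor entropy estimate
    (in nats) for a 1-D sample x:
        H = psi(n) - psi(k) + log(V_1) + (1/n) * sum_i log(eps_i),
    where eps_i is the distance from x_i to its k-th nearest neighbor
    and V_1 = 2 is the length of the 1-D unit ball."""
    n = len(x)
    log_eps = []
    for i, xi in enumerate(x):
        # distances to all other points; the k-th smallest is the k-NN radius
        d = sorted(abs(xj - xi) for j, xj in enumerate(x) if j != i)
        log_eps.append(math.log(d[k - 1]))
    psi_diff = sum(1.0 / j for j in range(k, n))  # psi(n) - psi(k) for integers
    return psi_diff + math.log(2.0) + sum(log_eps) / n
```

A useful exact property for sanity-checking: scaling the data by c > 0 scales every k-NN radius by c, so the estimate shifts by exactly log(c), matching H(cX) = H(X) + log(c).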


The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel

One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation usual kernel (GEUK), bias reduction kernel (BRK) and geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...
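The extrapolation idea behind such estimators can be sketched as follows: a Gaussian kernel estimate with bandwidth h has leading bias proportional to h^2, so combining estimates at bandwidths h and 2h cancels that term. This is a hedged sketch of our own (the additive combination (4*f_h - f_2h)/3 and the Terrell-Scott-style geometric form f_h^{4/3} * f_2h^{-1/3}), not necessarily the paper's exact GEUK/GEBRK constructions:

```python
import math

def kde(x, data, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    n = len(data)
    s = sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)
    return s / (n * h * math.sqrt(2.0 * math.pi))

def kde_extrapolated(x, data, h):
    """Additive extrapolation (4*f_h - f_2h)/3: the O(h^2) bias terms of
    f_h and f_2h are in ratio 1:4, so this combination cancels them."""
    return (4.0 * kde(x, data, h) - kde(x, data, 2.0 * h)) / 3.0

def kde_geometric(x, data, h):
    """Geometric extrapolation f_h^{4/3} * f_2h^{-1/3}: same exponents 4/3
    and -1/3 applied multiplicatively, which keeps the estimate nonnegative
    (the additive version can go negative in the tails)."""
    fh = kde(x, data, h)
    f2h = kde(x, data, 2.0 * h)
    return fh ** (4.0 / 3.0) * f2h ** (-1.0 / 3.0) if f2h > 0 else 0.0
```

The nonnegativity of the geometric form is the usual motivation for preferring it over the additive combination.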


An Application of Non-response Bias Reduction Using Propensity Score Methods

In many statistical studies some units do not respond to some or all of the questions. This situation causes a problem called non-response. Bias and variance inflation are two important consequences of non-response in surveys. Although increasing the sample size can prevent variance inflation, it cannot necessarily adjust for the non-response bias. Therefore a number of methods ...
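The core weighting idea can be shown with a minimal hedged sketch (the toy data and the response probabilities below are our own invention; in practice the propensities would be estimated, e.g. by a logistic model): respondents are weighted by the inverse of their response probability, so under-responding groups are up-weighted.

```python
def ipw_mean(values, response_probs, responded):
    """Inverse-propensity-weighted mean: each respondent is weighted by
    1 / P(respond), so groups that respond less are counted more."""
    num = sum(y / p for y, p, r in zip(values, response_probs, responded) if r)
    den = sum(1.0 / p for p, r in zip(response_probs, responded) if r)
    return num / den

# Toy population: group A (y = 1) responds with probability 0.8,
# group B (y = 3) with probability 0.2; true mean is 2.0.
# Using the exact expected respondent counts (80 A's, 20 B's):
values = [1.0] * 100 + [3.0] * 100
probs = [0.8] * 100 + [0.2] * 100
responded = [True] * 80 + [False] * 20 + [True] * 20 + [False] * 80

# Naive mean over respondents over-represents group A.
naive = sum(y for y, r in zip(values, responded) if r) / sum(responded)
```

The naive respondent mean is pulled toward the high-response group, while the weighted mean recovers the full-population mean exactly in this idealized setup.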




Publication date: 2009